
    Robotic motion learning framework to promote social engagement

    Abstract: Imitation is a powerful component of communication between people, and it has important implications for improving the quality of interaction in the field of human–robot interaction (HRI). This paper discusses a novel framework designed to improve human–robot interaction through robotic imitation of a participant's gestures. In our experiment, a humanoid robotic agent socializes and plays games with a participant. For the experimental group, the robot additionally imitates one of the participant's novel gestures during a play session. We hypothesize that the robot's use of imitation will increase the participant's openness towards engaging with the robot. Experimental results from a user study of 12 subjects show that, post-imitation, experimental subjects displayed a more positive emotional state, had higher instances of mood contagion towards the robot, and interpreted the robot as having a higher level of autonomy than their control-group counterparts did. These results point to an increased participant interest in engagement fueled by personalized imitation during interaction.

    Musical Robots For Children With ASD Using A Client-Server Architecture

    Presented at the 22nd International Conference on Auditory Display (ICAD-2016).
    People with Autism Spectrum Disorders (ASD) are known to have difficulty recognizing and expressing emotions, which affects their social integration. Leveraging and integrating recent advances in interactive robot and music therapy approaches, we have designed musical robots that can facilitate social and emotional interactions of children with ASD. The robots communicate with children with ASD while detecting their emotional states and physical activities, and then generate real-time sonification based on the interaction data. Given that we envision the use of multiple robots with children, we have adopted a client-server architecture. Each robot and sensing device acts as a terminal, while the sonification server processes all the data and generates harmonized sonification. After describing our goals for the use of sonification, we detail the system architecture and ongoing research scenarios. We believe that the present paper offers a new perspective on the application of sonification to assistive technologies.
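    The client-server idea in the abstract above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class and method names, the data fields, and the mapping from activity and emotion to tempo and musical mode are all assumptions made for the example.

```python
# Sketch: each robot or sensing device acts as a terminal pushing
# interaction data; a central server fuses the streams into one
# harmonized set of sonification parameters. All names and the
# parameter mapping are illustrative assumptions.

class SonificationServer:
    """Aggregates terminal readings and maps them to sound parameters."""

    # hypothetical mapping from detected emotional state to a musical mode
    MODES = {"happy": "major", "calm": "pentatonic", "distressed": "minor"}

    def __init__(self):
        self.latest = {}  # terminal_id -> most recent reading

    def receive(self, terminal_id, reading):
        """A terminal (robot or sensing device) pushes its latest data."""
        self.latest[terminal_id] = reading

    def sonify(self):
        """Fuse all terminal readings into one parameter set."""
        if not self.latest:
            return None
        # mean physical activity (0..1) across terminals controls tempo
        activity = sum(r["activity"] for r in self.latest.values()) / len(self.latest)
        # the most recently reported emotion selects the musical mode
        emotion = list(self.latest.values())[-1]["emotion"]
        return {
            "tempo_bpm": 60 + int(activity * 120),  # 60-180 bpm
            "mode": self.MODES.get(emotion, "pentatonic"),
        }

server = SonificationServer()
server.receive("robot_1", {"activity": 0.2, "emotion": "calm"})
server.receive("robot_2", {"activity": 0.8, "emotion": "happy"})
params = server.sonify()  # one harmonized output for both terminals
```

    A real deployment would replace the in-process `receive` calls with network messages from each robot, which is what the client-server split buys: terminals stay simple while the server owns all fusion and sound generation.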

    Vision-based force guidance for improved human performance in a teleoperative manipulation system

    ©2007 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
    Presented at the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, USA, Oct 29 - Nov 2, 2007. DOI: 10.1109/IROS.2007.4399119
    In this paper, we discuss a methodology that employs vision-based force guidance techniques for improving human performance in a teleoperated manipulation system. The primary focus of the approach is to study the effectiveness of guidance forces in a haptic system in enabling ease of use for human operators performing the common manipulation activities needed to accomplish everyday tasks. By designing force feedback signals constructed only from visual imagery data as input to a haptic device, we show the impact on human performance during the teleoperation sequence. The methodology is explained in detail, and results of implementation on object-centering and object-approaching tasks with our divided force guidance approach are presented.
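    For the object-centering task mentioned above, a guidance force derived purely from imagery can be sketched as below. This is a hedged illustration only: the proportional force law, the gain value, and the function name are assumptions, and the paper's actual "divided force guidance" scheme is not reproduced here.

```python
# Sketch: derive a haptic guidance force from visual data alone, for an
# object-centering task. The pixel error between the image center and the
# tracked object's centroid is scaled into a 2-D force for the haptic
# device. Gain and force law are illustrative assumptions.

def centering_force(object_px, image_size, gain=0.01):
    """Proportional guidance force pulling the view toward the object.

    object_px  -- (x, y) centroid of the tracked object, in pixels
    image_size -- (width, height) of the camera image, in pixels
    Returns an (fx, fy) force vector for the haptic device.
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    # pixel error between the object's centroid and the image center
    ex, ey = object_px[0] - cx, object_px[1] - cy
    return (gain * ex, gain * ey)

# object sits 120 px right of and 40 px below the center of a 640x480 image
fx, fy = centering_force((440, 280), (640, 480))
```

    The operator feels a force nudging the manipulator until the object's centroid coincides with the image center, at which point the error, and therefore the force, vanishes.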

    Towards Real-Time Haptic Exploration using a Mobile Robot as Mediator

    Presented at the 2010 IEEE Haptics Symposium, 25-26 March 2010, Waltham, MA. DOI: 10.1109/HAPTIC.2010.5444643
    In this paper, we propose a new concept of haptic exploration using a mobile manipulation system, which combines the mobility and manipulability of the robot with haptic feedback for user interaction. The system integrates heterogeneous robotic sensor readings to create a real-time spatial model of the environment, which in turn is conveyed to the user so that the user can explore a haptically represented environment and spatially perceive the world without direct contact. The sensors transform real-world values into an environmental model (an internal map), and this model drives the feedback rendered on the haptic device through which the user interacts with the haptically augmented space. Through this multi-scale convergence of dynamic sensor data and haptic interaction, our goal is to enable real-time exploration of the world through remote interfaces without the use of predefined world models. In this paper, the system algorithms and platform are discussed, along with preliminary results that show the capabilities of the system.
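    The sensor-to-map-to-haptics pipeline described above can be sketched minimally as follows. This is not the authors' algorithm: the grid representation, the repulsive force law, and all names and constants are illustrative assumptions.

```python
# Sketch of the pipeline: range-sensor hits are fused into a simple
# internal map, and the map is rendered haptically as a repulsive force
# that grows as the user's probe approaches a mapped obstacle.

def update_map(grid, hits):
    """Fuse sensor readings: mark cells reported occupied."""
    for cell in hits:
        grid.add(cell)
    return grid

def haptic_force(grid, probe, max_force=1.0, radius=3.0):
    """Repulsive force on the haptic cursor from the nearest mapped obstacle."""
    if not grid:
        return (0.0, 0.0)
    nearest = min(grid, key=lambda c: (c[0] - probe[0]) ** 2 + (c[1] - probe[1]) ** 2)
    dx, dy = probe[0] - nearest[0], probe[1] - nearest[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist >= radius or dist == 0.0:
        return (0.0, 0.0)
    # force magnitude ramps from 0 at the radius up to max_force at contact
    mag = max_force * (1.0 - dist / radius)
    return (mag * dx / dist, mag * dy / dist)

world = update_map(set(), [(5, 5), (8, 2)])   # two obstacle cells
force = haptic_force(world, probe=(5.0, 7.0))  # probe 2 cells from (5, 5)
```

    Because the map is rebuilt continuously from live sensor data, the rendered force field tracks the real environment rather than a predefined world model, which is the point of the approach.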

    Requirement of estrogen receptor alpha DNA-binding domain for HPV oncogene-induced cervical carcinogenesis in mice

    Cervical cancer is caused by human papillomavirus (HPV) in collaboration with other non-viral factors. The uterine cervix is hormone responsive, and female hormones have been implicated in the pathogenesis of the disease. HPV transgenic mice expressing the HPV16 oncogenes E6 (K14E6) and/or E7 (K14E7) have been employed to study the mechanism of estrogen and estrogen receptor α (ERα) in cervical carcinogenesis. Chronic exposure to physiological levels of exogenous estrogen leads to cervical cancer in the HPV transgenic mice, which depends on ERα. The receptor is composed of multiple functional domains, including a DNA-binding domain (DBD), which mediates its binding to estrogen-responsive elements (EREs) on target genes. Transcriptional control of genes by ERα is mediated by either a DBD-dependent (classical) or a DBD-independent (non-classical) pathway. Although the molecular mechanisms of ERα in cancer have been characterized extensively, studies investigating the importance of each pathway for carcinogenesis are scarce. In this study, we employ knock-in mice expressing an ERα DBD mutant (E207A/G208A) that is specifically defective for ERE binding. We demonstrate that the ERα DBD mutant fails to support estrogen-induced epithelial cell proliferation and carcinogenesis in the cervix of K14E7 transgenic mice. We also demonstrate that cervical diseases are absent in K14E7 mice when one ERα DBD mutant allele and one wild-type allele are present. We conclude that the ERα classical pathway is required for cervical carcinogenesis in a mouse model.

    The Effects of Robot Voices and Appearances on Users' Emotion Recognition and Subjective Perception

    As the influence of social robots in people's daily lives grows, research on understanding people's perception of robots, including sociability, trust, acceptance, and preference, becomes more pervasive. Research has considered visual, vocal, or tactile cues to express robots' emotions, whereas little research has provided a holistic view examining the interactions among the different factors influencing emotion perception. We investigated multiple facets of user perception of robots during a conversational task by varying the robots' voice types, appearances, and emotions. In our experiment, 20 participants interacted with two robots having four different voice types. While participants were reading fairy tales to the robot, the robot gave vocal feedback with seven emotions, and the participants evaluated the robot's profiles through post surveys. The results indicate that (1) the accuracy of emotion perception differed depending on the presented emotion, (2) a regular human voice showed higher user preference and naturalness, (3) a characterized voice, however, was more appropriate for expressing emotions, with significantly higher accuracy in emotion perception, and (4) participants showed significantly higher emotion recognition accuracy with the animal robot than with the humanoid robot. A follow-up study (N=10) with voice-only conditions confirmed the importance of embodiment. The results from this study could provide the guidelines needed to design social robots that consider emotional aspects in conversations between robots and users.

    Special Issue on Assistive and Rehabilitation Robotics
